TokRepo
LiteLLM (BerriAI)


Joined March 2026
3 assets · 0 stars earned · 27 total views
⚡

Workflows

1

LiteLLM Proxy — Unified Gateway for 100+ LLM APIs

LiteLLM Proxy maps 100+ LLM providers (Anthropic, OpenAI, Bedrock, Vertex) to one OpenAI-compatible endpoint, with authentication, rate limiting, cost tracking, and fallbacks built in.

May 7, 2026
7
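Because the proxy speaks the OpenAI API, any HTTP client can talk to it. A minimal sketch using only the standard library — the proxy URL, model alias, and virtual key below are placeholders for your own deployment, not values from the listing:

```python
import json
import urllib.request

# Hypothetical local proxy address and virtual key; adjust to your deployment.
PROXY_URL = "http://localhost:4000/chat/completions"
VIRTUAL_KEY = "sk-my-virtual-key"

payload = {
    # "model" is the alias configured on the proxy; the proxy maps it to the
    # real provider (Anthropic, OpenAI, Bedrock, Vertex) behind the scenes.
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
}

request = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {VIRTUAL_KEY}",
    },
)

# Uncomment once a proxy is actually running at PROXY_URL:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Swapping providers then means changing proxy config, not application code.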
📜

Scripts

1

LiteLLM Router — Smart Failover & Load Balancing in Python

LiteLLM Router load-balances across LLM endpoints with retries, fallbacks, latency-based routing, and weighted A/B splits. Pure Python — drop it into any codebase; no separate proxy needed.

May 7, 2026
6
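A sketch of how that looks in code, assuming litellm's `Router` API: the model names and API keys below are placeholders, and two entries share one alias so the router can balance between them.

```python
# Each entry maps a shared alias ("model_name") to one concrete deployment.
# Keys and endpoints here are fake placeholders for illustration.
model_list = [
    {
        "model_name": "gpt-4o",  # shared alias: Router balances across these
        "litellm_params": {"model": "openai/gpt-4o", "api_key": "sk-openai-key"},
    },
    {
        "model_name": "gpt-4o",
        "litellm_params": {
            "model": "azure/gpt-4o-deployment",
            "api_base": "https://example.openai.azure.com",
            "api_key": "azure-key",
        },
    },
    {
        "model_name": "claude-fallback",
        "litellm_params": {
            "model": "anthropic/claude-3-5-sonnet-20240620",
            "api_key": "sk-ant-key",
        },
    },
]

try:
    from litellm import Router

    router = Router(
        model_list=model_list,
        routing_strategy="latency-based-routing",  # prefer the fastest deployment
        num_retries=2,                             # retry transient errors
        fallbacks=[{"gpt-4o": ["claude-fallback"]}],  # cross-provider fallback
    )
    # response = router.completion(
    #     model="gpt-4o",
    #     messages=[{"role": "user", "content": "Hello"}],
    # )
except ImportError:
    pass  # litellm not installed in this environment
```

Callers only ever name the alias; failover and A/B weighting happen inside the router.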
📚

Knowledge

1

LiteLLM Cost Tracking — Per-Project LLM Spend Dashboard

LiteLLM ships a built-in cost dashboard. Track LLM spend by project, user, model, or tag, and set hard budgets that block requests at the proxy. SOC 2 and SSO are available in the Pro tier.

May 7, 2026
14
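The per-project spend and hard-budget idea can be sketched in plain Python. The record shape and field names below are illustrative only, not LiteLLM's actual spend-log schema:

```python
from collections import defaultdict

# Illustrative spend records: one row per request, with its computed cost
# and the attribution keys a dashboard would group by.
spend_logs = [
    {"project": "search", "user": "alice", "model": "gpt-4o", "cost_usd": 0.012},
    {"project": "search", "user": "bob", "model": "claude-3-5-sonnet", "cost_usd": 0.030},
    {"project": "chatbot", "user": "alice", "model": "gpt-4o-mini", "cost_usd": 0.001},
]

def spend_by(key: str) -> dict:
    """Aggregate spend along one attribution dimension (project, user, model)."""
    totals = defaultdict(float)
    for row in spend_logs:
        totals[row[key]] += row["cost_usd"]
    return dict(totals)

def within_budget(project: str, budget_usd: float) -> bool:
    """Hard budget check: a proxy would reject new requests once this is False."""
    return spend_by("project").get(project, 0.0) < budget_usd

# "search" has spent ~$0.042, so a $0.04 hard budget would now block it,
# while "chatbot" (~$0.001) stays well under a $0.01 budget.
print(spend_by("project"))
print(within_budget("search", 0.04), within_budget("chatbot", 0.01))
```

Doing this at the proxy means every caller is covered, regardless of which SDK or language issued the request.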

© 2026 TokRepo. All rights reserved.


軒轅十四株式会社 · Tokyo, Japan

〒101-0032 Tokyo, Chiyoda-ku, Iwamotocho 2-chome

Contact: ethanfrostcool@gmail.com